Training Samples in Objective Bayesian Model Selection
Authors
Abstract
Central to several objective approaches to Bayesian model selection is the use of training samples (subsets of the data), so as to allow utilization of improper objective priors. The most common prescription for choosing training samples is to choose them to be as small as possible, subject to yielding proper posteriors; these are called minimal training samples. When data can vary widely in terms of either information content or impact on the improper priors, use of minimal training samples can be inadequate. Important examples include certain cases of discrete data, the presence of censored observations, and certain situations involving linear models and explanatory variables. Such situations require more sophisticated methods of choosing training samples. A variety of such methods are developed in this paper, and successfully applied in challenging situations.
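To make the idea concrete, here is a small illustrative sketch (not taken from the paper): for testing a normal mean, H0: mu = 0 against H1: mu unknown with the improper flat prior pi(mu) = 1, a single observation already yields a proper posterior, so any one data point is a minimal training sample. The arithmetic intrinsic Bayes factor then multiplies the full-data Bayes factor by the average training-sample correction over all observations. The function name and the known-variance-one assumption are illustrative choices, not from the source.

```python
import math

def arithmetic_ibf(x):
    """Arithmetic intrinsic Bayes factor B10 for H0: mu = 0 vs
    H1: mu unknown, with N(mu, 1) data and the improper flat
    prior pi(mu) = 1 under H1.  One observation is a minimal
    training sample: it already makes the posterior for mu proper."""
    n = len(x)
    xbar = sum(x) / n
    # Bayes factor from the full data under the improper prior:
    # B10(x) = sqrt(2*pi/n) * exp(n * xbar^2 / 2)
    b10_full = math.sqrt(2 * math.pi / n) * math.exp(n * xbar ** 2 / 2)
    # Correction: average of B01 over all minimal training samples.
    # For a single observation x_l, m1(x_l) = 1 under the flat prior,
    # so B01(x_l) is just the standard normal density at x_l.
    phi = lambda z: math.exp(-z ** 2 / 2) / math.sqrt(2 * math.pi)
    correction = sum(phi(xl) for xl in x) / n
    return b10_full * correction
```

Data concentrated near zero gives a modest Bayes factor, while data far from zero strongly favors H1; the training-sample average is exactly the device that cancels the arbitrary constant in the improper prior.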
Similar Articles
Training Samples in Objective Bayesian Model Selection 1
Project Portfolio Risk Response Selection Using Bayesian Belief Networks
Risk identification, impact assessment, and response planning constitute three building blocks of project risk management. Correspondingly, three types of interactions could be envisioned between risks, between impacts of several risks on a portfolio component, and between several responses. While the interdependency of risks is a well-recognized issue, the other two types of interactions remai...
Objective Bayesian Analysis of Multiple Changepoints for Linear Models
This paper deals with the detection of multiple changepoints for independent but non identically distributed observations, which are assumed to be modeled by a linear regression with normal errors. The problem has a natural formulation as a model selection problem and the main difficulty for computing model posterior probabilities is that neither the reference priors nor any form of empirical B...
Power-Expected-Posterior Priors for Variable Selection in Gaussian Linear Models
Imaginary training samples are often used in Bayesian statistics to develop prior distributions, with appealing interpretations, for use in model comparison. Expected-posterior priors are defined via imaginary training samples coming from a common underlying predictive distribution m, using an initial baseline prior distribution. These priors can have subjective and also default Bayesian implem...
A Bayesian Model for Supervised Induction of Natural Language Grammar
In this paper, we show that the problem of grammar induction could be modeled as a combination of several model selection problems. We use the infinite generalization of a Bayesian model of cognition to solve each model selection problem in our grammar induction model. This Bayesian model is capable of solving model selection problems, consistent with human cognition. We also show that using th...
Journal:
Volume, Issue:
Pages: -
Published: 2002